An asymptotically optimal gradient algorithm for quadratic optimization with low computational cost
Authors
Abstract
We consider gradient algorithms for minimizing a quadratic function in R^n with large n. We suggest a particular sequence of step-lengths and demonstrate that the resulting gradient algorithm has a convergence rate comparable with that of Conjugate Gradients and other methods based on the use of Krylov spaces. When the problem is large and sparse, the proposed algorithm can be more efficient than ...
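The abstract does not reproduce the particular step-length sequence, so the sketch below only illustrates the general scheme it refers to: plain gradient iterations on f(x) = 0.5 x^T A x - b^T x driven by a user-supplied sequence of step-lengths. The function name gradient_quadratic and the placeholder step rule are illustrative assumptions, not the authors' method.

```python
import numpy as np

def gradient_quadratic(A, b, steps, x0=None, tol=1e-10, max_iter=1000):
    """Minimize f(x) = 0.5*x^T A x - b^T x with a prescribed step-length sequence.

    `steps` is any callable k -> gamma_k; the particular sequence proposed in
    the paper is not reproduced here.
    """
    x = np.zeros_like(b) if x0 is None else x0.copy()
    for k in range(max_iter):
        g = A @ x - b              # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        x = x - steps(k) * g       # plain gradient step with step-length gamma_k
    return x

# Usage sketch: a small diagonal test problem and a placeholder constant step
# 1/L (L = largest eigenvalue), not the paper's step-length sequence.
A = np.diag(np.linspace(1.0, 100.0, 50))
b = np.ones(50)
x_star = gradient_quadratic(A, b, steps=lambda k: 1.0 / 100.0)
```

Each iteration only needs one matrix-vector product with A, which is what makes such schemes attractive for large sparse problems compared with methods that build up a Krylov basis.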
Similar resources

An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems
In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. The parameters of the method are obtained by solving an optimization problem and by using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...
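The blurb does not give the paper's conjugate gradient parameter, so as a point of reference the sketch below implements a standard Polak-Ribiere+ nonlinear CG iteration with Armijo backtracking; the function nonlinear_cg and the test problem are assumptions for illustration, not the method derived from the modified secant condition.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=200, tol=1e-8):
    """Generic Polak-Ribiere+ nonlinear CG with Armijo backtracking.
    Shows the overall CG iteration only; the parameter studied in the paper
    (based on a modified secant condition) is not implemented here."""
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                  # safeguard: restart with steepest descent
            d = -g
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5                    # Armijo backtracking line search
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PR+ parameter
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: the Rosenbrock function as a standard unconstrained test problem.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
x_min = nonlinear_cg(f, grad, np.array([-1.2, 1.0]))
```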
An Optimum Algorithm for Single Machine with Early/Tardy Cost
The problem of determining the sequence of a set of jobs on a single machine, with the objective of minimizing the maximum earliness and tardiness, is studied. Just-in-time (JIT) production systems are one of the many applications of this problem. The problem is studied in special cases, and optimal solutions with simple orderings are given for them. In general, some effective conditions for neig...
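The objective described (the maximum of earliness and tardiness over the jobs on one machine) can be stated concretely. The helper below, with hypothetical names max_early_tardy, processing, and due, merely evaluates that objective for a given job order; it is not the paper's algorithm.

```python
from itertools import permutations

def max_early_tardy(sequence, processing, due):
    """Return the largest earliness/tardiness when the jobs in `sequence`
    are processed back-to-back on one machine starting at time 0.
    `processing[j]` and `due[j]` are illustrative names, not from the paper."""
    t = 0
    worst = 0
    for j in sequence:
        t += processing[j]                  # completion time of job j
        worst = max(worst, due[j] - t, t - due[j])
    return worst

# Usage: brute-force check of all orders of a tiny hypothetical instance.
p = {1: 3, 2: 2, 3: 4}
d = {1: 4, 2: 9, 3: 7}
best_order = min(permutations(p), key=lambda seq: max_early_tardy(seq, p, d))
```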
An Interior Point Algorithm for Solving Convex Quadratic Semidefinite Optimization Problems Using a New Kernel Function
In this paper, we consider convex quadratic semidefinite optimization problems and provide a primal-dual Interior Point Method (IPM) based on a new kernel function with a trigonometric barrier term. The iteration complexity of the algorithm is analyzed under some mild, easy-to-check conditions. Although our proposed kernel function is neither a Self-Regular (SR) fun...
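For orientation, kernel-function-based IPMs build their barrier as the sum of a univariate kernel applied to the eigenvalues of a scaled matrix variable. The sketch below uses the classical logarithmic kernel psi(t) = (t^2 - 1)/2 - ln t purely as an illustration; the paper's trigonometric kernel is not reproduced, and the names classical_kernel and barrier are assumptions.

```python
import numpy as np

def classical_kernel(t):
    """Classical logarithmic kernel psi(t) = (t^2 - 1)/2 - ln t, shown only for
    illustration; the paper proposes a kernel with a trigonometric barrier term."""
    return 0.5 * (t**2 - 1.0) - np.log(t)

def barrier(V):
    """Kernel-based barrier Psi(V): sum of psi over the eigenvalues of the
    symmetric positive definite scaled matrix V."""
    eigvals = np.linalg.eigvalsh(V)
    return float(np.sum(classical_kernel(eigvals)))

# Usage: Psi vanishes at the identity (the target of the centering step) and
# grows as an eigenvalue approaches 0 or becomes large.
print(barrier(np.eye(3)), barrier(np.diag([0.5, 1.0, 2.0])))
```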
Journal
Journal title: Optimization Letters
Year: 2012
ISSN: 1862-4472, 1862-4480
DOI: 10.1007/s11590-012-0491-7